Estimation in a discrete tail rate family of recapture sampling models
In the context of recapture sampling designs for debugging experiments, the problem of estimating the error or hitting rate of the faults remaining in a system is considered. Moment estimators are derived for a family of models in which the rate parameters are assumed proportional to the tail probabilities of a discrete distribution on the positive integers. The estimators are shown to be asymptotically normal and fully efficient. Their fixed-sample properties are compared, through simulation, with those of the conditional maximum likelihood estimators.
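The idea of rates proportional to discrete tail probabilities can be sketched in a toy simulation. This is a hypothetical illustration of the general setup, not the paper's model: we assume fault i has detection rate c * P(X >= i) for a geometric base distribution, detections over time t are Poisson, and we recover c by matching the observed total count to its expectation (a moment equation).

```python
import math
import random

# Hypothetical toy setup (our assumptions, not the paper's): fault i has
# detection rate lam_i = c * P(X >= i), the tail probability of a geometric
# distribution with parameter p.

def tail_rates(n_faults, c, p):
    # Geometric tail: P(X >= i) = (1 - p)**(i - 1)
    return [c * (1 - p) ** (i - 1) for i in range(1, n_faults + 1)]

def poisson(lam, rng):
    # Knuth's method; adequate for the modest rates used here
    L = math.exp(-lam)
    k, prob = 0, 1.0
    while prob > L:
        k += 1
        prob *= rng.random()
    return k - 1

def moment_estimate_c(counts, t, n_faults, p):
    # Moment equation: E[sum counts] = c * t * sum_i (1-p)**(i-1); solve for c
    tail_sum = sum((1 - p) ** (i - 1) for i in range(1, n_faults + 1))
    return sum(counts) / (t * tail_sum)

rng = random.Random(42)
true_c, p, n_faults, t = 2.0, 0.1, 50, 100.0
counts = [poisson(r * t, rng) for r in tail_rates(n_faults, true_c, p)]
print("moment estimate of c:", moment_estimate_c(counts, t, n_faults, p))
```

With many detections the estimate concentrates near the true value of c; the paper's actual estimators also handle the unknown number of remaining faults, which this sketch does not attempt.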
Probing TeV scale physics in precision UCN decays
We present a calculation of the matrix elements of the iso-vector scalar, axial, and tensor charges between a neutron and a proton state on dynamical HISQ configurations generated by the MILC Collaboration, using valence clover fermions. These matrix elements are needed to probe novel scalar and tensor interactions in neutron beta decay that can arise in extensions of the Standard Model at the TeV scale. Results are presented at one value of the lattice spacing and two values of the light-quark mass. We discuss two sources of systematic error, the contribution of excited states to these matrix elements and the renormalization constants, and the efficacy of the methods used to control them.
Comment: 7 pages; Proceedings of the 31st International Symposium on Lattice Field Theory - LATTICE 2013. PoS(LATTICE 2013)40
FogGIS: Fog Computing for Geospatial Big Data Analytics
Cloud Geographic Information Systems (GIS) have emerged as a tool for the analysis, processing, and transmission of geospatial data. Fog computing is a paradigm in which Fog devices help increase throughput and reduce latency at the edge, close to the client. This paper develops a Fog-based framework, named FogGIS, for mining analytics from geospatial data. We built a prototype using the Intel Edison, an embedded microprocessor platform, and validated FogGIS through preliminary analyses, including compression and overlay analysis. The results show that Fog computing holds great promise for the analysis of geospatial data. We used several open-source compression techniques to reduce transmission to the cloud.
Comment: 6 pages, 4 figures, 1 table, 3rd IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (09-11 December 2016), Indian Institute of Technology (Banaras Hindu University), Varanasi, India
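The compress-at-the-edge step can be sketched with standard tooling. This is a minimal illustration, not the paper's code: we assume a GeoJSON-style payload (the payload shape and field names here are invented) and gzip it before upload, which is one of the open-source compression techniques such a framework could use.

```python
import gzip
import json

def compress_for_upload(feature_collection: dict) -> bytes:
    """Gzip-compress a GeoJSON-style payload at the edge before cloud upload."""
    raw = json.dumps(feature_collection).encode("utf-8")
    return gzip.compress(raw, compresslevel=9)

# Toy GeoJSON-like payload (illustrative only; not from the paper)
payload = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [83.0 + i * 1e-4, 25.3]},
         "properties": {"sensor": f"s{i}", "reading": 20.0}}
        for i in range(500)
    ],
}

raw_size = len(json.dumps(payload).encode("utf-8"))
gz = compress_for_upload(payload)
print(f"raw {raw_size} B -> gzip {len(gz)} B "
      f"(ratio {raw_size / len(gz):.1f}x)")
```

Repetitive coordinate and property text compresses well, so even this generic approach cuts the bytes sent from an edge device to the cloud by a large factor.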
Massive gas gangrene secondary to occult colon carcinoma
Gas gangrene is a rare but often fatal soft-tissue infection. Because it is uncommon and the classic symptom of crepitus does not appear until the infection is advanced, prompt diagnosis requires a high index of suspicion. We present a case report of a middle-aged man who presented with acute-onset lower-extremity pain that was initially thought to be due to deep vein thrombosis. After undergoing a workup for pulmonary embolism, he was found to have massive gas gangrene of the lower extremity secondary to an occult colon adenocarcinoma, and he died within hours of presentation from multisystem organ failure.
Pion scattering amplitude with Wilson fermions
We present an exploratory calculation of the pion-pion scattering amplitude at threshold using Wilson fermions in the quenched approximation, including all the required contractions. We find good agreement with the predictions of chiral perturbation theory, even for pions of mass 560-700 MeV. Within the 10% errors, we do not see the onset of the bad chiral behavior expected for Wilson fermions. We also derive rigorous inequalities that apply to two-particle correlators and, as a consequence, show that the interaction in the antisymmetric state of two pions must be attractive.
Comment: This PS file includes 4 tables and figures 1-8 on 25 pages. Los Alamos Preprint Number LAUR-92-364
Hybrid Approach for Resource Allocation in Cloud Infrastructure Using Random Forest and Genetic Algorithm
In cloud computing, virtualization is a key technique for optimizing the power consumption of cloud data centers. As more services move to the cloud, the load on data centers increases; data centers therefore grow in size and consume more energy. Resolving this requires an efficient optimization algorithm for resource allocation. In this work, a hybrid approach to virtual machine allocation is proposed, combining a genetic algorithm (GA) with a random forest (RF), a supervised machine-learning technique. The aim is to minimize power consumption while maintaining good load balance among the available resources and maximizing resource utilization. The proposed model uses the genetic algorithm to generate a training dataset for the random forest and then obtains a trained model from it. Real workload traces from PlanetLab are used to evaluate the approach, with power consumption, execution time, resource utilization, and average start and finish times as performance metrics. The results show that the proposed GA-RF model improves the energy consumption, execution time, and resource utilization of the data center and its hosts compared with existing models.
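The GA half of such a pipeline can be sketched on a toy placement problem. This is our own simplified formulation, not the paper's: chromosomes encode a host index per VM, fitness counts active hosts (a crude proxy for power) plus a penalty for capacity overload, and the RF training step on GA-generated placements is omitted. All demands, capacities, and GA parameters below are invented.

```python
import random

# Toy VM-to-host placement GA (illustrative assumptions, not the paper's setup)
VM_CPU = [20, 35, 10, 50, 25, 30, 15, 40, 20, 10, 45, 30]   # VM CPU demands
HOSTS = 6
CAPACITY = 100

def fitness(assign):
    """Lower is better: active host count plus a heavy overload penalty."""
    load = [0] * HOSTS
    for vm, host in enumerate(assign):
        load[host] += VM_CPU[vm]
    active = sum(1 for l in load if l > 0)
    overload = sum(max(0, l - CAPACITY) for l in load)
    return active + 10 * overload

def evolve(pop_size=60, gens=200, seed=0):
    rng = random.Random(seed)
    n = len(VM_CPU)
    pop = [[rng.randrange(HOSTS) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        next_pop = pop[:10]                  # elitism: keep the 10 best
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:30], 2)   # select parents from fitter half
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < 0.2:           # point mutation: reassign one VM
                child[rng.randrange(n)] = rng.randrange(HOSTS)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=fitness)

best = evolve()
print("best placement:", best, "fitness:", fitness(best))
```

In the hybrid scheme described above, placements like `best`, labeled with their fitness, would form the training set for the random forest, which then predicts good allocations without rerunning the full GA.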